internal noise
Stochastic Optimal Control and Estimation with Multiplicative and Internal Noise
A pivotal brain computation relies on the ability to sustain perception-action loops. Stochastic optimal control theory offers a mathematical framework to explain these processes at the algorithmic level through optimality principles. However, incorporating a realistic noise model of the sensorimotor system -- accounting for multiplicative noise in feedback and motor output, as well as internal noise in estimation -- makes the problem challenging. The algorithm in common use is the one proposed in the seminal study of Todorov (2005). After uncovering pitfalls in the original derivation -- in particular, the assumption of unbiased estimation does not hold -- we improve the algorithm by proposing an efficient gradient descent-based optimization that minimizes the cost-to-go while imposing only linearity of the control law. The optimal solution is obtained by iteratively propagating the sufficient statistics in closed form to compute the expected cost, and then minimizing this cost with respect to the filter and control gains. We demonstrate that this approach yields a significantly lower overall cost than current state-of-the-art solutions, particularly in the presence of internal noise, although the improvement persists in other regimes as well, and we provide theoretical explanations for this enhanced performance. Providing the optimal control law is key for inverse control inference, especially in explaining behavioral data under rationality assumptions.
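The cost-to-go minimization via closed-form propagation of sufficient statistics can be illustrated on a scalar toy problem. The sketch below is a hedged illustration, not the paper's algorithm: all symbols (a, b, c, q, r, s0, T) and the finite-difference gradient step are assumptions chosen for the example. With multiplicative motor noise and a linear control law, the second moment of the state propagates in closed form, so the expected cost is exact (no sampling) and can be minimized over the feedback gain.

```python
# Hypothetical scalar system: x_{t+1} = a*x_t + b*(1 + c*eps_t)*u_t with
# multiplicative motor noise eps_t ~ N(0,1), linear law u_t = -l*x_t,
# and quadratic cost J = sum_t (q*x_t^2 + r*u_t^2).
# The sufficient statistic s_t = E[x_t^2] obeys the closed-form update
#   s_{t+1} = ((a - b*l)^2 + (b*c*l)^2) * s_t.

def expected_cost(l, a=1.0, b=1.0, c=0.5, q=1.0, r=0.1, s0=1.0, T=20):
    s, J = s0, 0.0
    for _ in range(T):
        J += (q + r * l**2) * s              # E[q*x^2 + r*u^2] with u = -l*x
        s = ((a - b*l)**2 + (b*c*l)**2) * s  # closed-form second-moment update
    return J

# Minimize the expected cost over the control gain by finite-difference
# gradient descent (the step size and initialization are illustrative).
l, lr, h = 0.5, 1e-2, 1e-5
for _ in range(1000):
    g = (expected_cost(l + h) - expected_cost(l - h)) / (2 * h)
    l -= lr * g
```

Note that the multiplicative-noise term penalizes large gains, so the optimum sits below the noiseless deadbeat gain; the full method additionally optimizes filter gains for the estimation stage.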
- North America > United States > California (0.14)
- North America > United States > Massachusetts (0.04)
- Europe > Denmark (0.04)
- Asia > Middle East > Jordan (0.04)
- North America > Canada (0.04)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Cognitive Science (1.00)
- Information Technology > Communications > Networks (0.93)
- Information Technology > Artificial Intelligence > Representation & Reasoning (0.68)
Impact of white noise in artificial neural networks trained for classification: performance and noise mitigation strategies
Semenova, Nadezhda, Brunner, Daniel
In recent years, the hardware implementation of neural networks, leveraging physical coupling and analog neurons, has substantially increased in relevance. Such nonlinear and complex physical networks provide significant advantages in speed and energy efficiency, but are potentially susceptible to internal noise when compared to digital emulations of such networks. In this work, we consider how additive and multiplicative Gaussian white noise at the neuronal level can affect the accuracy of the network when applied to specific tasks, with a softmax function included in the readout layer. We adapt several noise reduction techniques to the essential setting of classification tasks, which represent a large fraction of neural network computing. We find that these adjusted concepts are highly effective in mitigating the detrimental impact of noise.
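The effect can be reproduced on a toy classifier. The sketch below is purely illustrative: the two-blob data, the hand-set weights `W`, and the noise levels are assumptions, not the paper's setup. It injects additive and multiplicative Gaussian white noise at the neuron outputs before a softmax readout, then applies one simple mitigation, averaging the softmax outputs of several independent noisy passes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a hardware network: a fixed linear layer + softmax readout
# on two synthetic Gaussian classes (all parameters are illustrative).
n = 2000
X = np.vstack([rng.normal(-1, 1, (n, 2)), rng.normal(+1, 1, (n, 2))])
y = np.array([0] * n + [1] * n)
W = np.array([[-1.0, 1.0], [-1.0, 1.0]])   # hand-set weights; class scores = X @ W

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerically stable softmax
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def accuracy(sig_add=0.0, sig_mul=0.0, repeats=1):
    """Additive / multiplicative Gaussian white noise on the neuron outputs;
    averaging `repeats` independent noisy passes is a simple mitigation."""
    p = np.zeros((len(X), 2))
    for _ in range(repeats):
        z = X @ W
        z = z * (1 + sig_mul * rng.standard_normal(z.shape)) \
              + sig_add * rng.standard_normal(z.shape)
        p += softmax(z)
    return (p.argmax(axis=1) == y).mean()

clean = accuracy()
noisy = accuracy(sig_add=2.0, sig_mul=0.5)
pooled = accuracy(sig_add=2.0, sig_mul=0.5, repeats=32)
```

Averaging over repeated noisy passes is only one mitigation strategy; the paper adapts several such techniques specifically to softmax-readout classification.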
- Europe > Russia > Volga Federal District > Saratov Oblast > Saratov (0.04)
- North America > United States > Indiana > Madison County > Anderson (0.04)
- North America > Canada > British Columbia > Metro Vancouver Regional District > Vancouver (0.04)
- (3 more...)
Noise-Enhanced Associative Memories
Karbasi, Amin, Salavati, Amir Hesam, Shokrollahi, Amin, Varshney, Lav R.
Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms have allowed reliable learning and recall of an exponential number of patterns. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize performance. As long as the internal noise level is below a specified threshold, the error probability in the recall phase can be made exceedingly small. More surprisingly, we show that internal noise actually improves the performance of the recall phase. Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.
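A minimal sketch of the noisy-internal-computation setting: the code below uses a plain Hopfield-style memory with Hebbian weights, not the paper's structured-pattern, graph-based design, and the sizes and noise level are illustrative assumptions. Each neuron's input is perturbed by additive Gaussian internal noise during recall, yet a corrupted pattern is still recovered when the noise is small.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative associative memory at low load (3 patterns, 64 neurons).
N, M = 64, 3
P = rng.choice([-1, 1], size=(M, N))   # random bipolar patterns
W = (P.T @ P).astype(float) / N
np.fill_diagonal(W, 0.0)               # Hebbian weights, no self-coupling

def recall(x, sigma, steps=50):
    """Synchronous recall; sigma is the internal noise std on each neuron's input."""
    for _ in range(steps):
        h = W @ x + sigma * rng.standard_normal(N)  # noisy internal computation
        x = np.where(h >= 0, 1, -1)
    return x

# Corrupt 10 of 64 bits of pattern 0, then recall with small internal noise.
probe = P[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
out = recall(probe, sigma=0.05)
overlap = (out * P[0]).mean()          # 1.0 means perfect recall
```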
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.14)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Europe > Switzerland > Vaud > Lausanne (0.04)
- (3 more...)
Can We Transfer Noise Patterns? A Multi-environment Spectrum Analysis Model Using Generated Cases
Du, Haiwen, Ju, Zheng, An, Yu, Du, Honghui, Zhu, Dongjie, Tian, Zhaoshuo, Lawlor, Aonghus, Dong, Ruihai
Spectrum analysis systems in online water quality testing are designed to detect the types and concentrations of pollutants, enabling regulatory agencies to respond promptly to pollution incidents. However, spectral data-based testing devices suffer from complex noise patterns when deployed in non-laboratory environments. To make the analysis model applicable to more environments, we propose a noise-pattern-transferring model, which takes the spectra of standard water samples in different environments as cases and learns the differences in their noise patterns, thus enabling noise patterns to be transferred to unknown samples. However, the inevitable sample-level baseline noise prevents the model from obtaining paired data that differ only in dataset-level environmental noise. To address this problem, we generate a sample-to-sample case base to exclude the interference of sample-level noise on dataset-level noise learning, enhancing the system's learning performance. Experiments on spectral data with different background noises demonstrate the good noise-transferring ability of the proposed method against baselines including wavelet denoising, deep neural networks, and generative models. From this research, we posit that our method can enhance the performance of DL models by generating high-quality cases. The source code is made publicly available online at https://github.com/Magnomic/CNST.
- Asia > China > Heilongjiang Province > Harbin (0.04)
- Europe > Sweden > Stockholm > Stockholm (0.04)
- Europe > Ireland > Leinster > County Dublin > Dublin (0.04)
- (2 more...)
Replay attack spoofing detection system using replay noise by multi-task learning
Shim, Hye-Jin, Jung, Jee-weon, Heo, Hee-Soo, Yoon, Sunghyun, Yu, Ha-Jin
In this paper, we propose a spoofing detection system for replay attacks that exploits replay noise. In many previous studies across various domains, noise has been treated as something to be reduced. For replay attacks, however, we hypothesize that noise is the prominent feature distinguishing a spoofed signal from the original, and that it can be one of the keys to determining whether a signal has been spoofed. We define the noise introduced by the replay attack as replay noise; specifically, it includes the noise of playback devices, recording environments, and recording devices. We explore the effectiveness of training a deep neural network simultaneously for replay attack spoofing detection and replay noise classification. Multi-task learning is exploited to embed spoofing detection and replay noise classification in the code layer. Experimental results on the ASVspoof 2017 dataset demonstrate that our proposed system achieves a relative improvement of 30% on the evaluation set.
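The multi-task setup can be sketched as a shared code layer feeding two heads trained on a weighted joint loss. Everything below (layer sizes, the tanh code layer, the loss weight `alpha`) is an illustrative assumption, not the paper's architecture; the sketch only shows how the two objectives combine through the shared embedding.

```python
import numpy as np

rng = np.random.default_rng(2)

# Shared "code" layer feeding two heads: a binary spoofing detector and a
# replay-noise classifier. Dimensions are illustrative.
D, H, C = 40, 16, 5                    # feature dim, code dim, noise classes
W_shared = rng.normal(0, 0.1, (D, H))
w_spoof = rng.normal(0, 0.1, (H, 1))   # head 1: genuine vs replay
W_noise = rng.normal(0, 0.1, (H, C))   # head 2: replay-noise class

def joint_loss(x, y_spoof, y_noise, alpha=0.5):
    """Weighted multi-task objective over the shared code layer."""
    code = np.tanh(x @ W_shared)                   # shared embedding ("code layer")
    p = 1 / (1 + np.exp(-(code @ w_spoof)))        # sigmoid spoofing score
    bce = -(y_spoof * np.log(p) + (1 - y_spoof) * np.log(1 - p)).mean()
    logits = code @ W_noise
    logp = logits - np.log(np.exp(logits).sum(1, keepdims=True))  # log-softmax
    ce = -logp[np.arange(len(x)), y_noise].mean()
    return bce + alpha * ce                        # joint loss, alpha weights task 2

x = rng.normal(size=(8, D))
loss = joint_loss(x, y_spoof=rng.integers(0, 2, (8, 1)),
                  y_noise=rng.integers(0, C, 8))
```

Both heads backpropagate through `W_shared` during training, which is what forces replay-noise information into the shared code layer.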
- North America > United States > Massachusetts > Suffolk County > Boston (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Asia > South Korea > Seoul > Seoul (0.04)